Factorized Asymptotic Bayesian Inference for Mixture Modeling
Authors
Abstract
This paper proposes a novel approximate Bayesian inference method for mixture modeling. Our key idea is to factorize the marginal log-likelihood using a variational distribution over the latent variables. An asymptotic approximation, a factorized information criterion (FIC), is obtained by applying the Laplace method to each of the factorized components. In order to evaluate FIC, we propose factorized asymptotic Bayesian inference (FAB), which maximizes an asymptotically consistent lower bound of FIC. FIC and FAB have several desirable properties: 1) asymptotic consistency with the marginal log-likelihood, 2) automatic component selection on the basis of an intrinsic shrinkage mechanism, and 3) parameter identifiability in mixture modeling. Experimental results show that FAB outperforms state-of-the-art variational Bayesian (VB) methods.
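The criterion described in the abstract can be summarized with a hedged sketch. Assuming a standard BIC-style Laplace expansion applied to each mixture component c (with D_c free parameters, indicator z_{nc}, mixing-weight dimensionality D_alpha, and variational entropy H(q)), "applying the Laplace method to each of the factorized components" suggests a criterion roughly of the following form; the exact constants and regularity conditions are given in the paper:

FIC(M) \approx \max_q \Big\{ \mathbb{E}_q\big[\log p(x^N, z^N \mid \hat{\theta})\big] - \frac{D_\alpha}{2}\log N - \sum_{c=1}^{C} \frac{D_c}{2}\,\mathbb{E}_q\Big[\log \sum_{n=1}^{N} z_{nc}\Big] + H(q) \Big\}

The per-component penalty (D_c/2) log(sum_n z_{nc}) is what drives the intrinsic shrinkage: in EM-like updates it appears as a term of roughly -D_c / (2 N_c) added to component c's log-responsibility, where N_c is the component's expected count, so weakly supported components lose responsibility and can be pruned. The Python sketch below illustrates that mechanism on a one-dimensional Gaussian mixture; the function name fab_gmm_sketch, the pruning threshold eps, and the exact placement of the penalty are illustrative assumptions, not the authors' implementation.

import numpy as np

def fab_gmm_sketch(x, C=10, D_c=2, n_iter=100, eps=1e-2, seed=0):
    """Hypothetical FAB-style EM loop for a 1-D Gaussian mixture.

    The only departure from plain EM is the shrinkage term -D_c / (2 * N_c)
    added to each component's log-responsibility (assumed form, see the
    lead-in), where N_c is the expected number of points currently assigned
    to component c.  Components whose N_c stays small are down-weighted and
    eventually pruned, mimicking the automatic component selection described
    in the abstract.
    """
    rng = np.random.default_rng(seed)
    N = len(x)
    pi = np.full(C, 1.0 / C)                       # mixing weights
    mu = rng.choice(x, size=C, replace=False)      # component means
    var = np.full(C, np.var(x))                    # component variances
    q = np.full((N, C), 1.0 / C)                   # variational responsibilities

    for _ in range(n_iter):
        Nc = q.sum(axis=0) + 1e-12
        # E-step: Gaussian log-likelihood plus log mixing weight plus the
        # FAB-style shrinkage term.
        logp = (np.log(pi)
                - 0.5 * np.log(2.0 * np.pi * var)
                - 0.5 * (x[:, None] - mu) ** 2 / var
                - D_c / (2.0 * Nc))
        q = np.exp(logp - logp.max(axis=1, keepdims=True))
        q /= q.sum(axis=1, keepdims=True)

        # Prune components whose expected share has collapsed below eps
        # (an illustrative threshold, not taken from the paper).
        keep = q.sum(axis=0) / N > eps
        q, pi, mu, var = q[:, keep], pi[keep], mu[keep], var[keep]
        q /= q.sum(axis=1, keepdims=True)

        # M-step: standard weighted maximum-likelihood updates.
        Nc = q.sum(axis=0)
        pi = Nc / N
        mu = (q * x[:, None]).sum(axis=0) / Nc
        var = (q * (x[:, None] - mu) ** 2).sum(axis=0) / Nc + 1e-6

    return pi, mu, var

# Example: data drawn from three well-separated clusters, fitted with ten
# initial components; redundant components are down-weighted by the
# shrinkage term and pruned once their share drops below eps.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(m, 0.5, 300) for m in (-4.0, 0.0, 4.0)])
pi, mu, var = fab_gmm_sketch(x)
print(len(pi), "components survived, means:", np.round(np.sort(mu), 2))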
Similar works
Factorized Asymptotic Bayesian Inference for Latent Feature Models
This paper extends factorized asymptotic Bayesian (FAB) inference to latent feature models (LFMs). FAB inference has not been applicable to models, including LFMs, that do not satisfy a specific condition on the Hessian matrix of the complete log-likelihood, which is required to derive a “factorized information criterion” (FIC). Our asymptotic analysis of the Hessian matrix of LFMs shows that FIC of LFMs has...
Factorized Asymptotic Bayesian Hidden Markov Models
This paper addresses the issue of model selection for hidden Markov models (HMMs). We generalize factorized asymptotic Bayesian inference (FAB), which has recently been developed for model selection on independent hidden variables (i.e., mixture models), to time-dependent hidden variables. As with FAB in mixture models, FAB for HMMs is derived as an iterative lower bound maximization algorithm...
Factorized Asymptotic Bayesian Policy Search for POMDPs
This paper proposes a novel direct policy search (DPS) method with model selection for partially observed Markov decision processes (POMDPs). DPSs have been standard for learning POMDPs due to their computational efficiency and natural ability to maximize total rewards. An important open challenge for the best use of DPS methods is model selection, i.e., determination of the proper dimensionali...
A One-Stage Two-Machine Replacement Strategy Based on the Bayesian Inference Method
In this research, we consider an application of Bayesian inference to the machine replacement problem. The application concerns the time to replace two machines producing a specific product, each machine performing a special operation on the product, when there are manufacturing defects because of failures. A common practice for this kind of problem is to fit a single distribution to the co...
Non-Gaussian Statistical Models and Their Applications
Statistical modeling plays an important role in various research areas. It provides a way to connect observed data with statistical analysis. Based on the statistical properties of the observed data, an appropriate model can be chosen that leads to promising practical performance. The Gaussian distribution is the most popular and dominant probability distribution used in statistics, since it has an anal...